ShiftDelete.Net Global

AI’s water dilemma: Training consumes shocking amounts


Large language models (LLMs) like OpenAI’s ChatGPT and Google’s Bard are energy intensive, requiring massive server farms for training. Cooling those data centers also makes AI chatbots incredibly thirsty: new research suggests that training GPT-3 alone consumed 185,000 gallons (700,000 liters) of water. The chatbot’s rising popularity worries researchers, since growing demand could take a significant toll on water supplies, a concern that is especially relevant amid historic droughts and environmental uncertainty in the US.

Addressing the water footprint

Researchers at the University of California, Riverside and the University of Texas at Arlington published AI water-consumption estimates in a pre-print paper titled “Making AI Less ‘Thirsty’.” The paper reveals the significant water requirements of training GPT-3: the researchers found it consumes roughly as much clean freshwater as filling a nuclear reactor’s cooling tower.

The researchers also discussed future AI models and their water consumption. They anticipate that water requirements will continue to rise with newer models like GPT-4, which rely on more training data and a larger number of parameters than earlier versions.

When calculating AI’s water consumption, the researchers draw a distinction between water “withdrawal,” meaning water taken from a ground or surface source, and “consumption,” meaning water that is lost (largely to evaporation) rather than returned. Maintaining ideal temperatures in data centers requires immense amounts of water, much of which is consumed through evaporation in cooling towers. The issue isn’t limited to OpenAI or AI models: Google requested more than 2.3 billion gallons of water for data centers in just three states in 2019.
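To make the accounting concrete, estimates of this kind typically combine a workload’s electricity use with water-efficiency metrics for the data center (on-site cooling) and the power grid (off-site generation). The sketch below is purely illustrative: the function name and every constant (WUE and PUE values) are assumptions for demonstration, not figures from the paper.

```python
# Illustrative sketch of an operational water-footprint estimate:
# water = IT energy * on-site WUE  +  IT energy * PUE * off-site WUE.
# All constants below are assumed placeholder values, not the paper's numbers.

def water_footprint_liters(it_energy_kwh: float,
                           onsite_wue: float = 1.8,   # liters evaporated per kWh in cooling (assumed)
                           pue: float = 1.2,          # power usage effectiveness (assumed)
                           offsite_wue: float = 3.1   # liters per kWh of electricity generated (assumed)
                           ) -> float:
    """Rough operational water estimate for a data-center workload."""
    onsite = it_energy_kwh * onsite_wue             # cooling-tower evaporation on site
    offsite = it_energy_kwh * pue * offsite_wue     # water consumed generating the electricity
    return onsite + offsite

# Example: a hypothetical training run drawing 1,000,000 kWh of IT energy
print(round(water_footprint_liters(1_000_000)))  # → 5520000 liters
```

This also shows why training location and timing matter: the same workload run in a cooler climate or a more water-efficient facility has a lower effective WUE, and therefore a smaller footprint.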

Climate change and worsening droughts could amplify concerns over AI’s water usage. Researchers say there are some relatively clear ways to bring AI’s water price tag down, such as training models during cooler hours or in more water-efficient data centers.

However, these demand-side changes will require greater transparency from the tech companies building these models, something the researchers say is worryingly scarce. What is your opinion on this topic? Share your comments with us!
